
kubectl-ailogs#1449

Open
Pranav-V wants to merge 1 commit into cloud-ark:master from Pranav-V:feature/ailogs

Conversation


@Pranav-V Pranav-V commented Dec 4, 2025

PR adds LLM-enhanced runtime log analysis.

TODO:

  • make plugins/ai-analysis/Dockerfile public
  • code linting / hygiene

@devdattakulkarni devdattakulkarni marked this pull request as ready for review April 29, 2026 21:55
- "--secret"
- "webhook-tls-certificates"
containers:
- name: ollama-ai
Contributor

@devdattakulkarni devdattakulkarni Apr 29, 2026


This can be put inside a conditional check (ENABLE_AI_ANALYSIS == true); the variable will be defined in the Helm chart.
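A minimal sketch of the suggested gating, assuming a Helm value named `enableAIAnalysis` (the flag name, and the placeholder image, are assumptions, not part of this PR):

```yaml
# values.yaml (assumed flag name)
enableAIAnalysis: false

# deployment template: only render the AI container when the flag is set
# {{- if .Values.enableAIAnalysis }}
#       - name: ollama-ai
#         image: ollama/ollama   # placeholder image
# {{- end }}
```

With `enableAIAnalysis: false` (the default here), the `ollama-ai` container would not be rendered at all, so clusters that don't opt in carry no AI-analysis workload.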

@@ -0,0 +1,61 @@
from flask import Flask, request, jsonify
from ollama import generate
Contributor


Will it be possible to make this configurable to use different models (Claude, OpenAI, Gemini, etc.)?
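One way to make the backend pluggable is a small provider dispatch keyed off an environment variable. This is a hypothetical sketch, not the PR's code: the `AI_PROVIDER`/`AI_MODEL` variable names, the `echo` test backend, and the `analyze_logs` helper are all assumptions.

```python
import os

def _ollama_backend(prompt: str, model: str) -> str:
    # The PR's current path; imported lazily so other backends
    # don't require the ollama package to be installed.
    from ollama import generate
    return generate(model=model, prompt=prompt)["response"]

def _echo_backend(prompt: str, model: str) -> str:
    # Stand-in backend for local testing without any model server.
    return f"[{model}] {prompt}"

# Additional entries (OpenAI, Claude, Gemini, ...) would follow the
# same (prompt, model) -> str signature.
BACKENDS = {
    "ollama": _ollama_backend,
    "echo": _echo_backend,
}

def analyze_logs(prompt: str) -> str:
    provider = os.environ.get("AI_PROVIDER", "ollama")
    model = os.environ.get("AI_MODEL", "llama3")
    try:
        backend = BACKENDS[provider]
    except KeyError:
        raise ValueError(f"unknown AI_PROVIDER: {provider}")
    return backend(prompt, model)
```

The Flask handler would then call `analyze_logs(prompt)` instead of calling `generate` directly, and both variables could be surfaced through the same Helm chart values.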

def cr_ai_logs():
data = request.get_json(force=True)
logs = data.get("logs", "")
prompt = """
Contributor


Is this the best prompt that we can construct? How do the answers get affected by the prompt?
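One direction worth comparing against the PR's prompt: ask for a fixed, numbered output structure and truncate logs from the tail so the most recent lines survive the context window. This is a hypothetical sketch; the section wording, the `build_prompt` name, and the 4000-character budget are assumptions.

```python
def build_prompt(logs: str, max_chars: int = 4000) -> str:
    # Keep the end of the logs: the most recent lines usually
    # contain the failure being diagnosed.
    tail = logs[-max_chars:]
    return (
        "You are analyzing Kubernetes pod logs.\n"
        "1. Summarize any errors or warnings.\n"
        "2. Identify the most likely root cause.\n"
        "3. Suggest a concrete next debugging step.\n"
        "Respond with only those three numbered sections.\n\n"
        "Logs:\n" + tail
    )
```

Prompt variants like this are cheap to A/B: run the same captured logs through each candidate and compare whether the answers stay grounded in the log text or drift into generic advice.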

@devdattakulkarni devdattakulkarni changed the title [WIP] kubectl-ailogs kubectl-ailogs Apr 29, 2026